Partitioned Sparse A⁻¹ Methods
Authors
Abstract
This paper solves the classic Ax = b problem by constructing factored components of the inverses of L and U, the triangular factors of A. The number of additional fill-ins in the partitioned inverses of L and U can be made zero. The number of partitions is related to the path length of sparse vector methods. Allowing some fill-in in the partitioned inverses of L and U results in fewer partitions. Ordering algorithms most suitable for sparsity preservation in the inverses of L and U require additional fill-in in L and U themselves. Tests on practical power system matrices from 118 to 1993 nodes indicate that the proposed approach is competitive in serial environments and appears more suitable for parallel environments. Because sparse vectors are not required, the approach works not only for short-circuit calculations but also for power flow and stability computations.
Keywords: Power Flow, Stability, Parallel Computing, Sparse Matrices, Linear Equations, Partitioning.
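As a rough illustration of the factored-inverse idea (a minimal dense sketch, not the paper's sparse, fill-free construction), the fragment below writes the unit lower triangular factor L as a product of column-block factors L1 L2 ... Lp, inverts each block once, and replaces forward substitution with a short sequence of matrix-vector products. The partition sizes and the helper partitioned_column_factors are assumptions chosen only for illustration.

import numpy as np
from scipy.linalg import lu

def partitioned_column_factors(L, partitions):
    """Split unit lower triangular L into column-block factors with L = L1 @ L2 @ ... @ Lp."""
    n = L.shape[0]
    factors = []
    for cols in partitions:                          # e.g. slice(0, 2), slice(2, 4), ...
        Lk = np.eye(n)
        Lk[:, cols] = L[:, cols]                     # identity except in this partition's columns
        factors.append(Lk)
    return factors

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)) + 6 * np.eye(6)      # small, well-conditioned test matrix
b = rng.standard_normal(6)

P, L, U = lu(A)                                      # A = P @ L @ U
parts = [slice(0, 2), slice(2, 4), slice(4, 6)]      # hypothetical partition choice
W = [np.linalg.inv(Lk) for Lk in partitioned_column_factors(L, parts)]

y = P.T @ b
for Wk in W:                                         # y = Lp^{-1} ... L1^{-1} P^T b
    y = Wk @ y                                       # each step is a matrix-vector product
x = np.linalg.solve(U, y)                            # U could be partitioned the same way
assert np.allclose(A @ x, b)

In the sparse setting the abstract targets, each inverted partition keeps the sparsity of its block of L (no additional fill-in), so the products above become sparse matrix-vector operations that parallelize naturally.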
Similar resources
Incomplete Inverse Preconditioners
Incomplete LU factorization is a valuable preconditioning approach for sparse iterative solvers. An "ideal" but inefficient preconditioner for the iterative solution of Ax = b is A⁻¹ itself. This paper describes a preconditioner based on sparse approximations to partitioned representations of A⁻¹, along with results from an implementation of the proposed method in a shared-memory parallel en...
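A minimal sketch of the approximate-inverse preconditioning idea, assuming a simple drop-tolerance sparsification of A⁻¹ as a stand-in for the partitioned construction described above (the tridiagonal test matrix and the 1e-2 drop tolerance are illustrative assumptions):

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres, LinearOperator

n = 200
A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

M_dense = np.linalg.inv(A.toarray())          # exact inverse (small demo only)
M_dense[np.abs(M_dense) < 1e-2] = 0.0         # drop small entries -> sparse approximate inverse
M = sp.csr_matrix(M_dense)

prec = LinearOperator((n, n), matvec=lambda v: M @ v)   # apply M as the preconditioner
x, info = gmres(A, b, M=prec)
print(info, np.linalg.norm(A @ x - b))        # info == 0 indicates convergence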
Partitioned Shape Modeling with On-the-Fly Sparse Appearance Learning for Anterior Visual Pathway Segmentation
MRI quantification of cranial nerves such as the anterior visual pathway (AVP) is challenging due to their thin, small size, structural variation along the path, and adjacent anatomic structures. Segmentation of a pathologically abnormal optic nerve (e.g. optic nerve glioma) poses additional challenges due to changes in its shape at unpredictable locations. In this work, we propose a partitione...
Feature-distributed sparse regression: a screen-and-clean approach
Most existing approaches to distributed sparse regression assume the data is partitioned by samples. However, for high-dimensional data (D ≫ N), it is more natural to partition the data by features. We propose an algorithm for distributed sparse regression when the data is partitioned by features rather than samples. Our approach allows the user to tailor our general method to various distributed...
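A hedged sketch of a screen-and-clean pass with features split across blocks; the per-block lasso screen and the pooled least-squares refit below are assumptions made for illustration, not the paper's exact estimator:

import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
n, d = 100, 400                                  # d >> n
X = rng.standard_normal((n, d))
beta = np.zeros(d)
beta[[3, 150, 390]] = [2.0, -3.0, 1.5]
y = X @ beta + 0.1 * rng.standard_normal(n)

blocks = np.array_split(np.arange(d), 4)         # feature partition across 4 workers

# Screen: each worker runs a lasso on its own feature block only.
kept = []
for cols in blocks:
    coef = Lasso(alpha=0.1).fit(X[:, cols], y).coef_
    kept.extend(cols[np.abs(coef) > 1e-8])

# Clean: refit on the pooled support to discard spurious selections.
kept = np.array(sorted(kept))
final = LinearRegression().fit(X[:, kept], y)
print(kept, np.round(final.coef_, 2))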
Efficient Distributed Learning with Sparsity
We propose a novel, efficient approach for distributed sparse learning with observations randomly partitioned across machines. In each round of the proposed method, worker machines compute the gradient of the loss on local data and the master machine solves a shifted ℓ1-regularized loss minimization problem. After a number of communication rounds that scales only logarithmically with the number...
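A simplified sketch of one communication round under stated assumptions: workers send gradients of a local squared loss, and the master applies a soft-threshold (ℓ1-proximal) step as a simplified stand-in for the shifted ℓ1-regularized subproblem mentioned above:

import numpy as np

def soft_threshold(v, t):
    """Elementwise l1 proximal operator."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(2)
n, d, workers = 1200, 50, 4
X = rng.standard_normal((n, d))
beta_true = np.zeros(d)
beta_true[:5] = 1.0
y = X @ beta_true + 0.05 * rng.standard_normal(n)

shards = np.array_split(np.arange(n), workers)   # observations split across machines
w, step, lam = np.zeros(d), 0.5, 0.01

for _ in range(50):                              # communication rounds
    grads = [X[s].T @ (X[s] @ w - y[s]) / len(s) for s in shards]  # local gradients
    g = np.mean(grads, axis=0)                   # master aggregates the gradients
    w = soft_threshold(w - step * g, step * lam) # proximal update at the master

print(np.round(w[:8], 2))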
Parallel Sparse Triangular Solution with Partitioned Inverses and Prescheduled DAGs
Sparse triangular solution offers a challenging irregular problem for parallel systems. The repeated solution of the system Lx = b, where L is a lower triangular factor of a sparse matrix, arises in numerous applications. A previous study [1] has shown that, if L is an incomplete factor, Lx = b can be efficiently solved on a parallel system through substitution. This was accomplished by representin...
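A small sketch of the DAG-scheduling idea this abstract alludes to, assuming standard level scheduling of Lx = b; the serial loop only illustrates the schedule, since a parallel runtime would process the rows of each level concurrently:

import numpy as np
import scipy.sparse as sp

def level_schedule(L):
    """Group row indices of sparse lower triangular L into dependency levels."""
    L = L.tocsr()
    n = L.shape[0]
    level = np.zeros(n, dtype=int)
    for i in range(n):
        deps = L.indices[L.indptr[i]:L.indptr[i + 1]]
        deps = deps[deps < i]                    # strictly lower entries = dependencies
        level[i] = 1 + level[deps].max() if deps.size else 0
    return [np.where(level == k)[0] for k in range(level.max() + 1)]

n = 8
dense = np.tril(np.random.default_rng(3).random((n, n)) < 0.3, -1) * 1.0 + np.eye(n)
L = sp.csr_matrix(dense)
b = np.ones(n)
diag = L.diagonal()

x = np.zeros(n)
for rows in level_schedule(L):                   # levels run one after another
    for i in rows:                               # rows within a level are independent
        lo, hi = L.indptr[i], L.indptr[i + 1]
        cols, vals = L.indices[lo:hi], L.data[lo:hi]
        below = cols < i
        x[i] = (b[i] - vals[below] @ x[cols[below]]) / diag[i]

assert np.allclose(L @ x, b)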
Journal:
Volume, Issue:
Pages: -
Publication date: 2007